Homework 06: Unconstrained Nonlinear Optimization, and more(?)

This homework might or might not contain questions regarding previous material. That's because in real life, you never know which "chapter" of the textbook any particular problem will need.

It's just past the middle of the semester, and it's a good time to review our learning goals. Search for "Student Learning Outcomes" inside this file:
http://people.emich.edu/aross15/math560/math560_syllabus201609.html
Do you feel we're making good progress? Now that you know something about optimization, are there things you want to learn that aren't on that list? Do you want more focus on how to implement things, or do you feel like there's not much chance you'd be coding these methods anyway?

On with the homework. There are 3 great ways to learn this material:
i) doing things "by hand" (but using a calculator or spreadsheet or other help to do some computations)
ii) using the code that I showed in class (a rough sketch in that spirit appears below)
iii) using this Desmos graph (you will need to change the objective function, and maybe the gradient and Hessian): https://www.desmos.com/calculator/mgkpzuybef
If you're only going to choose one method, I suggest method (i). Of course, the best learning would happen by doing all 3 and comparing. It's best to compare at each little step, so errors get found early.
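For those leaning toward method (ii), here is a minimal sketch in that spirit, written in Python. It is only a sketch, not the code from class: the bracket [0, 1] for alpha, the decision to count only the line-search evaluations, and all of the names are assumptions you are free to change. It does one Steepest Descent step on the f(x,y) from Problem 1, with two Golden Section iterations for the line search.

```python
import math

# Objective from Problem 1 and its exact gradient.
def f(x, y):
    return x**2 + 4*y**2

def grad_f(x, y):
    return (2*x, 8*y)

def golden_section(g, a, b, iterations):
    # Shrink the bracket [a, b] for g(alpha) a fixed number of times,
    # then return the midpoint of the final bracket.
    phi = (math.sqrt(5) - 1) / 2              # about 0.618
    x1 = b - phi * (b - a)
    x2 = a + phi * (b - a)
    g1, g2 = g(x1), g(x2)
    for _ in range(iterations):
        if g1 < g2:                           # minimum lies in [a, x2]
            b, x2, g2 = x2, x1, g1
            x1 = b - phi * (b - a)
            g1 = g(x1)
        else:                                 # minimum lies in [x1, b]
            a, x1, g1 = x1, x2, g2
            x2 = a + phi * (b - a)
            g2 = g(x2)
    return (a + b) / 2

evals = 0                                     # objective-evaluation counter, for part (a)

def g(alpha):
    global evals
    evals += 1
    return f(x0 + alpha * dx, y0 + alpha * dy)

x0, y0 = 5.0, 6.0
dx, dy = [-c for c in grad_f(x0, y0)]         # search direction = negative gradient

alpha = golden_section(g, 0.0, 1.0, 2)        # the bracket [0, 1] is an arbitrary choice
xn, yn = x0 + alpha * dx, y0 + alpha * dy
print("alpha from Golden Section:", alpha)
print("new point:", (xn, yn), "with objective value", f(xn, yn))
print("objective evaluations inside the line search:", evals)
print("% improvement:", 100 * (f(x0, y0) - f(xn, yn)) / f(x0, y0))
```

Whatever bracket and bracketing method you choose, compare the numbers against your hand computations at each little step, not just at the end.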
1. Do one step of Steepest Descent using f(vector x) = f(x,y) = 1x^2 + 4y^2, starting at x=5, y=6. Use an exact gradient computation rather than finite differences. For the line search part, use the Golden Section method and just do 2 iterations. You will need to choose your own initial alpha and bracketing method. Also do an exact line search by writing g(alpha) = f(vector x + alpha * vector d), graphing g(alpha), and then finding its minimum. Here, vector d is the search direction (in this case, the negative gradient).
Report:
a) how many times you evaluated the objective function (not including the exact line search part)
b) the % improvement in the objective function from the initial point to the new point (using Golden Section)
c) the % improvement in the objective function from the initial point to the new point (using exact line search)
d) how different are the alpha values for Golden Section and exact line search?

2.
a) At the new x vector you finished Problem 1 with, compute the new gradient.
b) Is that new gradient perpendicular to the initial gradient at (5,6)? Perhaps report what the angle between them is.
c) Do a new line search in that direction (using exact line search: graph g(alpha) as above).

3. Compute the gradient at (5,6) in Problem 1 by doing finite differences. Report how similar it is to the exact gradient you found in Problem 1.

4. Repeat Problem 1 using h(x,y) = 10x^2 + 40y^2, again starting at x=5, y=6. Use the same initial "alpha" that you used above. Comment on why this is an interesting question, and comment on your results vs. Problem 1.

5. Using f(x,y) from Problem 1, do one step of Newton's method, from the same initial point. Once you've found the search direction, do two steps of Golden Section for the line search. Use an exact Hessian and gradient computation rather than finite differences. Remember that alpha=1 is often our initial guess for alpha when using Newton's method. Also do an exact line search by writing g(alpha) = f(vector x + alpha * vector d), graphing g(alpha), and then finding its minimum. (A sketch of the finite-difference and Newton-direction computations appears after Problem 7.)
Report:
a) how many times you evaluated the objective function (not including the exact line search part)
b) the % improvement in the objective function from the initial point to the new point (using Golden Section)
c) the % improvement in the objective function from the initial point to the new point (using exact line search)
d) how different are the final alpha values for Golden Section and exact line search?

6. Compare your results from Problem 1 and Problem 5.

7. For Problem 5, compute the Hessian at x=5, y=6 using finite differences. Compare to the exact Hessian.
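Problems 3, 5, and 7 involve finite-difference derivatives and the Newton search direction. Here is a similar Python sketch for those pieces, again only one of many reasonable setups rather than the required one; the step size h = 1e-5, the central-difference formulas, and the use of Cramer's rule for the 2x2 Newton system are my own choices.

```python
# Finite-difference gradient (Problem 3) and Hessian (Problem 7), plus the
# Newton direction computed from the exact gradient and Hessian (Problem 5).
def f(x, y):
    return x**2 + 4*y**2

def exact_gradient(x, y):
    return (2*x, 8*y)

def exact_hessian(x, y):
    return ((2.0, 0.0), (0.0, 8.0))           # constant, since f is quadratic

h = 1e-5                                      # finite-difference step: an arbitrary choice

def fd_gradient(f, x, y):
    # central differences in each coordinate
    gx = (f(x + h, y) - f(x - h, y)) / (2 * h)
    gy = (f(x, y + h) - f(x, y - h)) / (2 * h)
    return (gx, gy)

def fd_hessian(f, x, y):
    # central second differences; the mixed partial uses the four corner points
    hxx = (f(x + h, y) - 2 * f(x, y) + f(x - h, y)) / h**2
    hyy = (f(x, y + h) - 2 * f(x, y) + f(x, y - h)) / h**2
    hxy = (f(x + h, y + h) - f(x + h, y - h)
           - f(x - h, y + h) + f(x - h, y - h)) / (4 * h**2)
    return ((hxx, hxy), (hxy, hyy))

x0, y0 = 5.0, 6.0
print("exact gradient:", exact_gradient(x0, y0))
print("FD gradient:   ", fd_gradient(f, x0, y0))      # compare, for Problem 3
print("exact Hessian: ", exact_hessian(x0, y0))
print("FD Hessian:    ", fd_hessian(f, x0, y0))       # compare, for Problem 7

# Newton direction for Problem 5: solve H d = -gradient (Cramer's rule for 2x2).
(gx, gy) = exact_gradient(x0, y0)
((a, b), (c, d)) = exact_hessian(x0, y0)
det = a * d - b * c
dx = (-gx * d - b * (-gy)) / det
dy = (a * (-gy) - c * (-gx)) / det
print("Newton direction:", (dx, dy))          # then run the line search g(alpha) as before
```

Try a few different values of h and notice how close (or not) the finite-difference results come to the exact derivatives; that comparison is what Problems 3 and 7 ask you to report.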